Partial Multi-Label Learning via Large Margin Nearest Neighbour Embeddings
Authors
Abstract
To deal with ambiguities in partial multi-label learning (PML), existing popular PML research attempts to perform disambiguation by direct ground-truth label identification. However, these approaches can be easily misled by noisy false-positive labels during the iterative updating of model parameters and latent variables. When the labeling information is ambiguous, we should depend more on the underlying structure of the data, such as feature correlations, for partially labeled data. Moreover, large margin nearest neighbour (LMNN) is a strategy that exploits such structure for classification; yet, due to the ambiguity of PML, traditional LMNN cannot be used to solve the problem directly. In addition, embedding is an effective technique for decreasing noise. Inspired by these techniques, we propose a novel paradigm called Partial Multi-label Learning via Large Margin Nearest Neighbour Embeddings (PML-LMNNE), which conducts disambiguation by projecting labels and features into a lower-dimensional space and reorganizing the underlying structure simultaneously. An efficient algorithm is designed to implement the proposed method, and its convergence rate is analyzed. We also present a theoretical analysis of the generalization error bound of PML-LMNNE, which shows that the generalization error converges to the sum of two times the Bayes error over the labels when the number of instances goes to infinity. Comprehensive experiments on artificial and real-world datasets demonstrate the superiorities of PML-LMNNE.
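To make the LMNN idea the abstract builds on concrete, the sketch below shows a toy LMNN-style objective under a linear embedding: each point is pulled toward a same-labelled "target" neighbour, while differently-labelled "impostors" are pushed beyond that distance plus a margin. This is a minimal illustration of the standard LMNN loss adapted to multi-label data (treating examples that share a label as similar), not the authors' PML-LMNNE algorithm; all names (`lmnn_margin_loss`, `mu`, etc.) are illustrative assumptions.

```python
import numpy as np

def lmnn_margin_loss(X, Y, L, margin=1.0, mu=0.5):
    """Toy LMNN-style objective under a linear embedding Z = X @ L.T.

    X : (n, d) feature matrix
    Y : (n, q) binary label matrix
    L : (k, d) projection into a k-dimensional embedding space
    Returns (1 - mu) * pull + mu * push, where `pull` draws each point
    toward its nearest same-labelled neighbour and `push` is a hinge
    penalty on differently-labelled points inside the margin.
    """
    Z = X @ L.T
    n = Z.shape[0]
    # pairwise squared distances in the embedding space
    D = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    # treat examples sharing at least one label as "similar"
    sim = (Y @ Y.T) > 0
    np.fill_diagonal(sim, False)
    pull, push = 0.0, 0.0
    for i in range(n):
        if not sim[i].any():
            continue
        # target neighbour: nearest point with a shared label
        j = np.argmin(np.where(sim[i], D[i], np.inf))
        pull += D[i, j]
        # impostors: dissimilar points closer than D[i, j] + margin
        imp = ~sim[i]
        imp[i] = False
        push += np.maximum(0.0, margin + D[i, j] - D[i, imp]).sum()
    return (1 - mu) * pull + mu * push
```

Minimizing such a loss over `L` (e.g. by gradient descent) yields the low-dimensional embedding in which nearest-neighbour classification is performed; PML-LMNNE additionally handles the candidate-label ambiguity, which this sketch omits.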
Similar resources
Large-Margin Multi-Label Causal Feature Learning
In multi-label learning, an example is represented by a descriptive feature associated with several labels. Simply considering labels as independent or correlated is crude; it would be beneficial to define and exploit the causality between multiple labels. For example, an image label ‘lake’ implies the label ‘water’, but not vice versa. Since the original features are a disorderly mixture of th...
Large-Scale Bayesian Multi-Label Learning via Topic-Based Label Embeddings
We present a scalable Bayesian multi-label learning model based on learning low-dimensional label embeddings. Our model assumes that each label vector is generated as a weighted combination of a set of topics (each topic being a distribution over labels), where the combination weights (i.e., the embeddings) for each label vector are conditioned on the observed feature vector. This construction, ...
Large-margin nearest neighbor classifiers via sample weight learning
The nearest neighbor classification is a simple and yet effective technique for pattern recognition. Performance of this technique depends significantly on the distance function used to compute similarity between examples. Some techniques were developed to learn weights of features for changing the distance structure of samples in nearest neighbor classification. In this paper, we propose an ap...
Convergence of Multi-pass Large Margin Nearest Neighbor Metric Learning
Large margin nearest neighbor classification (LMNN) is a popular technique to learn a metric that improves the accuracy of a simple k-nearest neighbor classifier via a convex optimization scheme. However, the optimization problem is convex only under the assumption that the nearest neighbors within classes remain constant. In this contribution we show that an iterated LMNN scheme (multi-pass LMN...
Large Margin Metric Learning for Multi-Label Prediction
Canonical correlation analysis (CCA) and maximum margin output coding (MMOC) methods have shown promising results for multi-label prediction, where each instance is associated with multiple labels. However, these methods require an expensive decoding procedure to recover the multiple labels of each testing instance. The testing complexity becomes unacceptable when there are many labels. To avoi...
Journal
Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence
Year: 2022
ISSN: 2159-5399, 2374-3468
DOI: https://doi.org/10.1609/aaai.v36i6.20628